Patent abstract:
CAPTURE PROJECTION SYSTEM, PROCESSOR-EXECUTABLE MEDIUM AND WORKSPACE COLLABORATION METHOD. In one example, a capture projection system includes: a controller; a workspace camera operatively connected to the controller for capturing still images and video images of an object in a workspace; and a projector operatively connected to the controller. The controller is configured to control the workspace camera and the projector to capture an image of a real object in the workspace and to project the image of the object into the workspace. In another example, a workspace collaboration method includes: capturing a digital image of a real object in a first workspace; simultaneously projecting the image of the object into multiple workspaces, including the first workspace; capturing a digital image of the object image as it is altered in one of the workspaces; and simultaneously projecting the altered object image into the multiple workspaces, including the first workspace.
Publication number: BR112014002186B1
Application number: R112014002186-4
Filing date: 2011-11-02
Publication date: 2020-12-29
Inventor: David Bradley Short
Applicant: Hewlett-Packard Development Company, L.P.
IPC primary class:
Patent description:

Technical field
[001] Various types of mixed reality systems have been developed to produce new environments where real and virtual objects coexist and interact in real time. Virtual whiteboards and other types of remote collaboration systems have also been developed to allow remote users to share and manipulate information simultaneously in multiple locations.
Description of the drawings
[002] Figures 1A and 1B are external views in perspective illustrating an example of a new capture projection system. In figure 1A, the image of a two-dimensional object (a printed photograph) was captured and displayed. In figure 1B, the image of a three-dimensional object (a cube) was captured and displayed;
[003] Figure 2 is an internal perspective view illustrating an example of a new capture projection system;
[004] Figure 3 is a block diagram of the capture projection system shown in figure 2;
[005] Figure 4 is a block diagram illustrating an example of a user input device in the system shown in figures 2 and 3;
[006] Figures 5 and 6 are elevated side and front views, respectively, illustrating the positioning of the camera and the projector in the capture projection system presented in figures 2 and 3;
[007] Figures 7 to 11 are a progression of side elevation views showing various positions for the projector and the camera in a capture projection system, illustrating some of the problems associated with moving the glare point out of the camera's capture area;
[008] Figures 12 and 13 illustrate an example of the camera in the capture projection system presented in figures 2 and 3;
[009] Figure 14 illustrates an example of the projector in the capture projection system shown in figures 2 and 3;
[0010] Figures 15 and 16 illustrate examples of the user input device in the capture projection system presented in figures 2 and 3;
[0011] Figures 17 to 19 are perspective views illustrating an example of a new portable capture projection device;
[0012] Figures 20 to 22 illustrate three examples of scenarios for the use of a capture projection system with other devices;
[0013] Figure 23 is a block diagram illustrating an example of a new capture projection device that includes object recognition and audio / video teleconferencing functions;
[0014] Figure 24 is a block diagram illustrating an example of architecture for the implementation of a capture projection device, such as that shown in figure 23, in a collaborative environment;
[0015] Figure 25 is a block diagram illustrating an example of a controller for the implementation of an overlay technique in which real and virtual objects are treated as visually interchangeable logical layers;
[0016] Figure 26 shows a capture projection system in which real pieces are placed on a virtual checkerboard projected on the work surface;
[0017] Figure 27 illustrates an example of overlapping on the Z axis for real and virtual objects in the system in Figure 26; and
[0018] Figure 28 is a flow chart illustrating an example of a method for implementing an overlay technique in which real and virtual objects are treated as visually interchangeable logical layers.
[0019] The same part numbers designate the same or similar parts in all figures.
Detailed Description
[0020] The examples presented in the figures and described below illustrate, but do not limit the invention, which is defined in the Claims that follow this Description.
[0021] A new capture projection system has been developed to improve the interactive user experience of working with real objects and with objects projected onto a physical work surface, and to improve virtual collaboration between multiple remote users. The new system can be implemented, for example, in one or more independent portable devices deployed on a common work surface. A digital camera, a projector and control programming are housed together in a desktop unit that enables an augmented-virtual-reality projection in which real and projected/virtual objects can be manipulated and shared simultaneously by multiple remote users. Such portable devices can be connected almost anywhere at any time for interactive collaboration via a comparatively inexpensive platform suitable not only for larger, corporate business environments but also for small businesses and even individual consumers.
[0022] As used in this document, a “real” object means an object that is not displayed, projected or otherwise rendered as an image; and a “virtual” object means an object that is displayed, projected or otherwise rendered as an image.
[0023] Examples of a new capture projection system and portable capture projection devices will be described first with reference to figures 1 to 19. Examples of the implementation of the new capture projection system and devices in a collaborative environment will then be described with reference to figures 20 to 28.
Capture projection system and devices
[0024] Figures 1A and 1B are external views in perspective illustrating an example of a new capture projection system 10 and an interactive workspace 12 associated with system 10. Figure 2 is a perspective view illustrating an example of a capture projection system 10 with the outer housing 13 removed. Figure 3 is a block diagram of the system 10 shown in figure 2. With reference to figures 1A, 1B, 2 and 3, the capture projection system 10 includes a digital camera 14, a projector 16 and a controller 18. The camera 14 and the projector 16 are operatively connected to the controller 18 so that the camera 14 can capture an image of an object 20 in the workspace 12, the projector 16 can project the image of the object 22 into the workspace 12 and, in some instances, the camera 14 can capture an image of the projected object image 22. The bottom of the housing 13 includes a transparent window 21 over the projector 16 (and the infrared camera 30).
[0025] In the example shown in figure 1A, a two-dimensional object 20 (a printed photograph) placed on the work surface 24 in the workspace 12 was photographed by the camera 14 (figure 2), the object 20 was removed to the side of the workspace 12, and the image of the object 22 was projected onto the work surface 24, where it can be photographed by the camera 14 (figure 2) and/or otherwise manipulated by a user and reprojected into the workspace 12. In the example shown in figure 1B, a three-dimensional object 20 (a cube) placed on the work surface 24 was photographed by the camera 14 (figure 2), the object 20 was removed to the side of the workspace 12, and the image of the object 22 was projected into the workspace 12, where it can be photographed by the camera 14 and/or otherwise manipulated by a user and reprojected into the workspace 12.
[0026] In one implementation example for system 10, the controller 18 is programmed and the projector 16 is configured to project the image of the object 22 into the workspace 12 at the same position the object 20 occupied when its image was captured by the camera 14. In this way, a one-to-one scale digital duplicate 22 of an object 20 can be projected over the original, allowing the digital duplicate to be manipulated, moved and otherwise altered in its place as desired by a local user or by multiple remote users collaborating in the same projected workspace 12. The projected image can also be moved away from the original, allowing a user to work with the original and the duplicate together in the same workspace 12.
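The one-to-one reprojection described above requires a mapping from camera pixels to projector pixels. The patent does not specify how this mapping is obtained; a common approach, sketched below under that assumption, is to fit a planar homography from four calibration correspondences on the work surface (all coordinates here are hypothetical):

```python
import numpy as np

def fit_homography(src, dst):
    """Fit a 3x3 projective transform mapping src points to dst points.

    src, dst: four (x, y) pairs, e.g. camera pixels -> projector pixels.
    Uses the standard direct linear transform (DLT): each correspondence
    contributes two linear constraints on the nine homography entries.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.array(rows, dtype=float)
    # The homography is the null-space vector of A (smallest singular value).
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, x, y):
    """Apply the homography to one point (homogeneous divide)."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With such a mapping, a pixel where the camera saw the object can be sent to the projector pixel that lights the same spot on the work surface, which is what makes the projected duplicate land on the original's position.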
[0027] System 10 also includes a user input device 26 that allows the user to interact with system 10. A user can interact with the object 20 and/or the object image 22 in the workspace 12 by means of the input device 26, with the object image 22 being transmitted to other workspaces 12 on remote systems 10 (not shown) for collaborative user interaction and, if desired, the object image 22 can be photographed by the camera 14 and reprojected into local and/or remote workspaces 12 for additional user interaction. In figure 1A, the work surface 24 is part of the desktop or other underlying support structure 23. In figure 1B, the work surface 24 is on a portable mat 25 that can include touch-sensitive areas. In figure 1A, for example, a user control panel 27 is projected onto the work surface 24, while in figure 1B the control panel 27 can be embedded in a touch-sensitive area of the mat 25. Similarly, a positioning area 29 for A4-, letter- or other standard-size documents can be projected onto the work surface 24 in figure 1A or printed onto the mat 25 in figure 1B. Obviously, other configurations for a work surface 24 are possible. For example, it may be desirable in some applications for system 10 to use an otherwise blank mat 25 to control the color, texture or other characteristics of the work surface 24, and thus the control panel 27 and the document positioning area 29 can be projected onto the blank mat 25 in figure 1B in the same way they are projected onto the desktop 23 in figure 1A.
[0028] In the example shown in figure 4, the user input device 26 includes an infrared digital pen 28 and an infrared camera 30 for detecting the pen 28 in the workspace 12. Although any suitable user input device can be used, a digital pen has the advantage of allowing input in three dimensions, including along the work surface 24, without a sensor pad or other special surface. In this way, system 10 can be used on a greater variety of work surfaces 24. In addition, the common horizontal orientation of the work surface 24 makes it useful for many common tasks. The ability to use traditional writing instruments on the work surface 24 is an advantage over vertical or mobile computing interfaces. Projecting an interactive display onto a worktable mixes computing tasks with the normal objects on a normal desktop, so physical objects can coexist with projected objects. As such, the comfort of using real writing instruments in addition to their digital counterparts (like the pen 28) makes for an effective model. A padless, three-dimensional digital pen allows annotations to be made on or next to physical objects without a sensor pad getting in the way of the use of traditional instruments on the work surface 24.
[0029] In one implementation example for system 10, the projector 16 serves as the light source for the camera 14. The camera capture area 32 (figure 12) and the projector display area 34 (figure 14) overlap on the work surface 24. In this way, substantial operational efficiency can be achieved by using the projector 16 both for projecting images and as the camera light source. The light path from the projector 16 through the workspace 12 to the work surface 24 must be positioned in relation to the camera 14 to allow user interaction with the display with minimal shadow occlusion, while avoiding specular glare from the work surface 24 and from objects in the workspace 12 that would otherwise blind the camera 14. The system configuration described below avoids the glare-induced artifacts that would result from a conventional camera lighting geometry, while maintaining a sufficiently steep angle of incidence for the projector light path to provide adequate lighting and projection of two- and three-dimensional objects in the workspace 12.
[0030] Ideally, the projector 16 would be mounted directly above the workspace 12 at an infinite height above the work surface 24 to guarantee parallel light rays. This configuration, of course, is not realistic. Even if the projector 16 were lowered to a realistic height above the work surface 24 (but still pointing straight down), the projector's light would be reflected off glossy and semi-glossy surfaces and objects straight back into the camera 14, creating a blinding specular glare. Thus, the glare point must be moved out of the camera capture area 32. (Specular glare refers to specular reflection, in which the angle of the incident light ray and the angle of the reflected light ray are equal and the incident, reflected and normal directions are coplanar.)
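The geometry behind this constraint can be sketched with a simple mirror-image model, which is not part of the original disclosure and uses hypothetical dimensions: treating the work surface as a horizontal mirror, the glare point is where the camera's line of sight to the projector's mirror image below the surface crosses the surface.

```python
def glint_x(proj_x, proj_h, cam_x, cam_h):
    """Surface point (along one axis) where the camera sees the projector's
    specular glare on a glossy work surface at height 0.

    The reflected ray obeys angle-of-incidence = angle-of-reflection, which
    is equivalent to sighting the projector's mirror image at height -proj_h.
    """
    t = cam_h / (cam_h + proj_h)          # where the sight line hits height 0
    return cam_x + t * (proj_x - cam_x)

def glint_in_capture_area(proj_x, proj_h, cam_x, cam_h, area_min, area_max):
    """True when the glare point would fall inside the capture area."""
    x = glint_x(proj_x, proj_h, cam_x, cam_h)
    return area_min <= x <= area_max
```

For example, a projector mounted directly above the camera (as in figure 7) puts the glare point right at the camera's footprint, inside the capture area, while offsetting the projector far enough to the side (figure 8) pushes it out.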
[0031] To achieve a commercially reasonable solution to this specular glare problem, the camera 14 and the projector 16 are moved away from the center of the capture and display areas 32, 34, and the projector 16 is positioned low, close to the base 36, as shown in figures 5 and 6, with a folded mirror 38 inserted into the projector's light path to simulate a projector position high above the work surface 24. The simulated position of the projector 16 and the corresponding light path above the mirror 38 are shown in dashed lines in figures 5 and 6. Before describing the configuration shown in figures 5 and 6 in more detail, however, it is useful to consider the problems associated with other possible configurations for moving the glare point out of the camera capture area 32.
[0032] In figure 7, the camera 14 is positioned at the center of the capture area 32 with an overhead projector 16 slightly off-center so that the camera 14 does not block the projector light path. In the configuration of figure 7, the specular glare point 39 (at the intersection of the incident light ray 41 and the reflected light ray 43) falls within the capture area 32 and will therefore blind the camera 14 for some objects and images in the capture area 32. In addition, for the configuration shown in figure 7, in which the camera 14 and the projector 16 are both positioned high above the base, system 10 would be top-heavy and thus undesirable for a commercial product implementation. If the projector 16 is positioned to the side at the distance needed to move the glare point 39 out of the camera capture area 32, as shown in figure 8, the corresponding projector lens offset required would not be feasible. In addition, any product implementation of the system 10 configuration shown in figure 8 would be undesirably wide and top-heavy.
[0033] Moving the camera 14 off-center over the capture area 32 brings the projector 16 in closer, making the system less wide, as shown in figure 9, but the projector lens offset is still too large and the product is still top-heavy. In the configuration shown in figure 10, the projector 16 is raised to a height at which it can be brought in close enough for an acceptable lens offset, but the product is now, of course, too tall and top-heavy. The most desirable solution is the "folded" light path for the projector 16 shown in figures 5 and 11, in which the tall configuration of figure 10 is simulated using a folded mirror 38. In figures 5 and 11, the projector 16 and the upper light path are folded over the reflective surface of the mirror 38 to project the same light path onto the work surface 24 as in the configuration of figure 10. This folding effect is best seen in figure 5, where the fold angles are θ1 = θ2 and Φ1 = Φ2.
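The folded light path can be modeled as an ordinary mirror reflection: the simulated overhead projector position is the mirror image of the real projector position across the plane of the mirror 38. A minimal sketch follows, using hypothetical coordinates and a 45-degree mirror through the origin; this is an illustration of the principle, not the patent's stated geometry:

```python
import numpy as np

def reflect_across_mirror(point, mirror_point, mirror_normal):
    """Mirror image of `point` across a plane through `mirror_point` with
    normal `mirror_normal` (works in 2-D or 3-D coordinates)."""
    p = np.asarray(point, dtype=float)
    m = np.asarray(mirror_point, dtype=float)
    n = np.asarray(mirror_normal, dtype=float)
    n = n / np.linalg.norm(n)
    # Standard reflection: subtract twice the signed distance along the normal.
    return p - 2.0 * np.dot(p - m, n) * n
```

With a 45-degree mirror, a projector placed low and forward, e.g. at (y, z) = (300, 50), has a virtual position high overhead at (50, 300): the mirror trades height for horizontal reach, which is exactly the fold effect where θ1 = θ2 and Φ1 = Φ2.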
[0034] As shown in figures 5 and 6, the camera 14 is positioned in front of the mirror 38, above the workspace 12, so that it does not block the projector light path. The camera 14 is positioned off-center in the Y direction (figure 5) as part of the overall geometry that keeps the glare point 39 out of the capture area 32 with an acceptable offset for both the camera 14 and the projector 16. The projector 16 is aimed at the mirror 38 so that light from the projector 16 is reflected off the mirror 38 into the workspace 12. By moving the projector 16 down low and inserting a folded mirror 38 into the projector's light path, the glare point 39 is kept out of the capture area 32 with an acceptable projector offset, and system 10 is sufficiently narrow, short and stable (not top-heavy) to support a commercially attractive product implementation.
[0035] Thus, and referring again to figures 1A, 1B and 2, the components of system 10 can be housed together as a single device 40. Referring also to figure 3, to help implement system 10 as an integrated, independent device 40, the controller 18 can include a processor 42, a memory 44 and an input/output 46 housed together in the device 40. The input/output 46 allows the device 40 to receive information from and send information to an external device, as described below with reference to figures 20 to 22. While the input/output 46 is shown in figure 3 as part of the controller 18, some or all of the input/output 46 can be separate from the controller 18.
[0036] For the controller 18 configuration shown in figure 3, the system programming for controlling and coordinating the functions of the camera 14 and the projector 16 can reside substantially in the controller memory 44 for execution by the processor 42, thus enabling an independent device 40 and reducing the need for special programming of the camera 14 and the projector 16. The programming for the controller 18 can be implemented in any suitable form of processor-executable medium, including one or more software modules, hardware modules, special-purpose hardware (e.g., application-specific hardware, application-specific integrated circuits (ASICs), embedded controllers, hardwired circuitry, etc.), or some combination of these. Furthermore, while other configurations are possible, for example one in which the controller 18 is formed in whole or in part using a computer or server remote from the camera 14 and the projector 16, a compact independent appliance such as the device 40 shown in figures 1A, 1B and 2 offers the user full functionality in an integrated, compact mobile device 40.
[0037] Referring now to figure 12, the camera 14 is positioned in front of the mirror 38, above the workspace 12, at a location offset from the center of the capture area 32. As noted above, this offset position of the camera 14 helps avoid specular glare when photographing objects in the workspace 12 without blocking the projector 16 light path. While the camera 14 generally represents any digital camera suitable for selectively capturing still images and video in the workspace 12, it is expected that a high-resolution digital camera will be used in most applications of system 10. A "high-resolution" digital camera, as used in this document, means a camera having a sensor array of at least 12 megapixels. Lower-resolution cameras may be acceptable for some basic scanning and copying functions, but resolutions below 12 megapixels are currently inadequate to generate a digital image sufficiently detailed for the full range of manipulative and collaborative functions. Small, high-quality digital cameras with high-resolution sensors are now quite common and commercially available from a variety of camera manufacturers. A high-resolution sensor paired with the high-performance digital signal processing (DSP) chips available in many digital cameras provides sufficiently fast image processing times, for example a click-to-preview time of less than one second, to deliver acceptable performance for most applications of system 10.
[0038] Referring now also to figure 13, in the example shown, the camera sensor 50 is oriented in a plane parallel to the plane of the work surface 24 and light is focused onto the sensor 50 through a shift lens 52. This configuration of the sensor 50 and lens 52 can be used to optically correct keystone distortion, without digital keystone correction in the object image. The field of view of the camera 14 defines a three-dimensional capture space 51 in the workspace 12 within which the camera 14 can effectively capture images. The capture space 51 is bounded in the X and Y dimensions by the camera capture area 32 on the work surface 24. The lens 52 can be optimized for a fixed distance, fixed focus and fixed zoom corresponding to the capture space 51.
[0039] Referring to figure 14, the projector 16 is positioned near the base 36, outside the projector display area 34, and is focused on the mirror 38 so that light from the projector 16 is reflected off the mirror 38 into the workspace 12. The projector 16 and the mirror 38 define a three-dimensional display space 53 in the workspace 12 within which the projector 16 can effectively display images. The projector display space 53 overlaps the camera capture space 51 (figure 12) and is bounded in the X and Y dimensions by the display area 34 on the work surface 24. While the projector 16 generally represents any suitable light projector, the compact size and energy efficiency of an LED- or laser-based DLP (digital light processing) projector will be desirable for most applications of system 10. The projector 16 can also employ a shift lens to allow full optical keystone correction in the projected image. As noted above, the use of the mirror 38 increases the projector's effective light path length, mimicking an overhead placement of the projector 16, while still allowing a commercially reasonable height for an integrated, independent device 40.
[0040] An example of characteristics suitable for system 10 as an independent device 40 is shown in Table 1. (The dimension references in Table 1 are to Figures 5 and 6.) Table 1

[0041] Since the projector 16 acts as the light source for the camera 14 for still-image and video capture, the projector light must be bright enough to swamp any ambient light that might cause specular glare defects. It has been determined that projector light of 200 lumens or more will be strong enough to swamp ambient light for the typical desktop applications of system 10 and device 40. For video capture and real-time video collaboration, the projector 16 shines white light into the workspace 12 to illuminate the object(s) 20. For an LED projector 16, the time sequencing of the red, green and blue LEDs that make up the white light is synchronized with the video frame rate of the camera 14. The refresh rate of the projector 16 and each LED subframe refresh period should be an integral number of the camera's exposure time for each captured frame, to avoid "rainbow banding" and other unwanted effects in the video image. In addition, the camera's video frame rate should be synchronized with the frequency of any ambient fluorescent lighting, which typically flickers at twice the AC line frequency (for example, 120 Hz for a 60 Hz AC power line). An ambient light sensor can be used to sense the frequency of the ambient light and adjust the camera 14 video frame rate accordingly. For still-image capture, the projector's red, green and blue LEDs can be turned on simultaneously as a camera flash to increase light intensity in the workspace 12, helping to swamp ambient light and allowing faster shutter speeds and/or smaller apertures to reduce image noise.
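The flicker-synchronization constraint above can be illustrated with a small helper that keeps only frame rates whose frame period spans a whole number of ambient flicker periods. This is an illustrative sketch, not part of the original disclosure; the candidate frame rates are hypothetical:

```python
def flicker_free_frame_rates(line_hz, candidates=(24, 25, 30, 48, 50, 60)):
    """Frame rates (fps) whose period is an integer number of ambient
    flicker periods.  Fluorescent lighting flickers at twice the AC line
    frequency, so a 60 Hz line gives 120 Hz flicker.

    A rate qualifies when the flicker frequency divides evenly by it, i.e.
    each frame sees the same whole number of flicker cycles.
    """
    flicker_hz = 2 * line_hz
    return [fps for fps in candidates if flicker_hz % fps == 0]
```

On a 60 Hz line (120 Hz flicker) the qualifying candidates are 24, 30 and 60 fps, while on a 50 Hz line (100 Hz flicker) they are 25 and 50 fps, which is why regional line frequency matters to the ambient light sensor described above.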
[0042] The example configuration of system 10 integrated into an independent device 40 shown in the figures and described above achieves a desirable balance between product size, performance, usability and cost. The folded light path for the projector 16 reduces the height of the device 40 while maintaining an effective placement of the projector high above the workspace 12 to prevent specular glare in the camera capture area 32. The projector light path strikes the horizontal work surface 24 at a steep angle, enabling image capture of 3D objects. This combination of a longer light path and a steep angle minimizes light falloff across the capture area, maximizing light uniformity for camera flash use. In addition, the folded light path allows the projector 16 to be positioned near the base 36 for product stability.
[0043] Input devices and techniques suitable for use with system 10 include, for example, finger touch, touch gestures, air gestures, speech recognition, head tracking and eye tracking. A touch-sensitive mat can be used to allow a multi-touch interface for navigating a graphical user interface or performing intuitive gesture actions such as pushing, flicking, swiping, scrolling, pinch-to-zoom and two-finger rotation. Depth cameras using structured light, time of flight, disturbed light patterns or stereoscopic vision can also be used to enable air gestures or limited touch and touch-gesture detection without a touch-sensitive mat. A padless digital pen is particularly well suited as a user input 26 for system 10. Thus, in the example shown in the figures, the user input 26 includes an infrared digital pen 28 and an infrared camera 30 for detecting the pen 28 in the workspace 12. As noted above, a padless digital pen has the advantage of allowing input in three dimensions, including along the work surface 24, without a sensor mat or other special surface.
[0044] Referring now to figures 4 and 15, the input device 26 includes the infrared pen 28, the infrared camera 30 and a pen charging station 54. The pen 28 includes an infrared lamp 56, a touch-sensitive tip switch 58 for automatically turning the lamp 56 on and off based on touch, and a manual on/off switch 60 for manually turning the lamp 56 on and off. (The tip switch 58 and the manual switch 60 are shown in the block diagram of figure 4.) The lamp 56 can be positioned, for example, at the tip of the pen 28, as shown in figure 15, to help maintain a clear line of sight between the camera 30 and the lamp 56. The lamp 56 can also emit visible light to help the user determine whether the lamp is on or off.
[0045] The tip switch 58 can be touch-sensitive to about 2 g of force, for example, to simulate a traditional writing instrument. When the pen tip touches the work surface 24 or another object, the tip switch 58 detects the contact and turns on the lamp 56. The lamp 56 turning on is detected by the camera 30, which signals a touch contact event (similar to a mouse button click or a finger touch on a touch-sensitive mat). The camera 30 continues to signal contact, tracking any movement of the pen 28, as long as the lamp 56 remains on. The user can slide the pen 28 across any surface like a physical pen to trace the surface or activate control functions. When the pen tip is no longer in contact with an object, the lamp 56 is turned off and the camera 30 signals no contact. The manual light switch 60 can be used to signal a non-touch event. For example, when working in a three-dimensional workspace 12, the user may wish to modify, alter or otherwise manipulate an image projected above the work surface 24 by manually signaling a "virtual" contact event.
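The tip-switch and manual-switch signalling described above amounts to a simple state machine: the infrared lamp tracks the two switches, and the camera treats "lamp visible" as contact. A toy model under those assumptions follows; the 2 g threshold comes from the paragraph above, while the class and method names are invented for illustration:

```python
class InfraredPen:
    """Toy model of pen 28: the IR lamp follows the tip switch or the
    manual override switch."""

    TIP_THRESHOLD_GRAMS = 2.0  # approximate trip force from the description

    def __init__(self):
        self.lamp_on = False

    def tip_force(self, force_grams):
        # Tip switch 58: lamp on while the tip presses with enough force.
        self.lamp_on = force_grams >= self.TIP_THRESHOLD_GRAMS

    def manual_switch(self, on):
        # Manual switch 60: signals "virtual" contact events in mid-air.
        self.lamp_on = on


class InfraredCamera:
    """Toy model of camera 30: contact is signalled while the lamp is lit."""

    def sees_contact(self, pen):
        return pen.lamp_on
```

Sliding the pen across a surface keeps the lamp lit, so the camera keeps signalling contact while tracking the lamp's position, which is exactly the mouse-drag-like behavior described above.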
[0046] The infrared camera 30 and the mirror 38 define a three-dimensional infrared capture space 61 in the workspace 12 within which the infrared camera 30 can effectively detect light from the pen 28. The capture space 61 is bounded in the X and Y dimensions by an infrared camera capture area 62 on the work surface 24. In the example shown, as best seen by comparing figures 14 and 15, the infrared camera capture space 61 is coextensive with the projector display space 53. In this way, the infrared camera 30 can capture pen activation anywhere in the display space 53.
[0047] In the implementation example shown in figure 16, the camera 30 is integrated into the projection light path so that the projector's field of view and the infrared camera's field of view coincide, helping to ensure that the pen 28, and hence the tracking signal from the infrared camera 30, is properly aligned with the projector display anywhere in the workspace 12. Referring to figure 16, the visible light 64 generated by red, green and blue LEDs 66, 68 and 70 in the projector 16 passes through various optical elements 72 (including a shift lens 74) out to the mirror 38 (figure 14). Infrared light 75 from the pen 28 in the workspace 12, reflected off the mirror 38 toward the projector 16, is directed to the infrared camera sensor 76 by an infrared beam splitter 78 through a shift lens 80. (Similar to the example configuration of the camera 14 described above, the infrared light sensor 76 of the camera 30 can be oriented in a plane parallel to the plane of the work surface 24, with the light focused onto the sensor 76 through the shift lens 80 for full optical keystone correction.)
[0048] It may be desirable for some commercial implementations to house the projector 16 and the infrared camera 30 together in a single compartment 82, as shown in figure 16. The geometric configuration of the infrared camera 30 shown in figure 16 helps ensure that the pen tracking signal is aligned with the display no matter how high the pen 28 is held above the work surface 24. If the projector's field of view and the infrared camera's field of view do not coincide, it can be difficult to calibrate the pen tracking at more than one height above the work surface 24, creating the risk of a parallax shift between the desired pen input position and the resulting displayed position.
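The parallax risk described above follows from similar triangles: if pen tracking is calibrated only at the surface plane, a pen raised above the surface appears displaced along the surface by an amount that grows with pen height and with the horizontal offset between the pen and the tracking camera. A sketch with hypothetical dimensions (not taken from the patent) follows:

```python
def parallax_shift(pen_height, cam_height, horiz_dist):
    """Apparent surface-plane shift for a pen held `pen_height` above the
    surface, imaged by a camera calibrated only at the surface plane.

    cam_height:  camera height above the work surface
    horiz_dist:  horizontal distance from the camera to the pen
    All arguments in the same units; returns the shift in those units.
    Derived from similar triangles along the camera's sight line.
    """
    return pen_height * horiz_dist / (cam_height - pen_height)
```

A pen on the surface (height 0) shows no shift at any offset, which is why a surface-only calibration appears correct until the pen is lifted; making the two fields of view coincident, as in figure 16, removes the horizontal offset and hence the shift at every height.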
[0049] Although the workspace 12 is generally expected to include a physical work surface 24 for supporting an object 20, the workspace 12 can also be implemented as a fully projected workspace without a physical work surface. In addition, the workspace 12 can be implemented as a three-dimensional workspace for working with two- and three-dimensional objects, or as a two-dimensional workspace for working with only two-dimensional objects. While the configuration of the workspace 12 will usually be determined largely by the hardware and programming elements of system 10, the configuration of the workspace 12 can also be affected by the characteristics of a physical work surface 24. Thus, in some examples of system 10 and device 40, it may be appropriate to consider the workspace 12 part of system 10, in the sense that the virtual workspace accompanies system 10 to be manifested in a physical workspace when the device 40 is operational, and in other examples it may be appropriate to consider the workspace 12 not part of system 10.
[0050] Figures 17 to 19 are perspective views illustrating another example of a portable capture projection device 40 and an interactive workspace 12 associated with the device 40. Referring to figures 17 to 19, the portable device 40 includes a digital camera 14 for capturing still and video images of an object 20 in the capture area 32 (and capture space 51), and a projector 16 for illuminating an object in the capture area 32 (and capture space 51) and for projecting images into the display area 34 (and display space 53). A two-dimensional object 20 (a printed photograph) placed in the capture area 32 was photographed by the camera 14 (figures 17 and 18), the object 20 was removed from the capture area 32, and an image of the object 22 was projected into the display area 34 (figure 19), where it can be photographed by the camera 14 and/or otherwise manipulated by a user.
[0051] In this example, the device 40 also includes a display 84 for selectively displaying a live feed from the camera 14, an image previously captured by the camera 14, or the representation of an image as it is manipulated by the user through a graphical user interface (GUI) 86 projected into the display space 53. (The GUI 86 is projected into the display area 34 in the example shown in figures 17 to 19.) The camera 14, the projector 16 and the display 84 are operatively connected together through a controller 18 and housed together in the compartment 13 as a single portable device 40. The projector 16 is positioned below the camera 14, high in the compartment 13, to project light directly into the display space 53 and onto the display area 34. The projector display space 53 and display area 34 overlap the camera capture space 51 and capture area 32, so that the projector 16 can serve as the light source for the camera 14 to capture images of real objects 20 in the space 51 and area 32, and so that the camera 14 can capture images of object images 22 projected into the space 51 and area 32.
[0052] Controller 18 is programmed to generate, and projector 16 to project, a GUI 86 that includes, for example, device control “buttons”, such as the Capture button 88 in figures 17 and 18 and the Undo, Fix, and OK buttons 90, 92, and 94, respectively, in figure 19. Although the device 40 shown in figures 17 to 19 could also include a more complex GUI and corresponding control programming in controller 18, as well as other user input device(s), the device configuration of figures 17 to 19 illustrates basic digital copying and image manipulation functions better suited to a lower-cost consumer desktop product market.
[0053] The examples of system 10 and device 40 presented in the figures, with one camera 14 and one projector 16, do not preclude the use of two or more cameras 14 and/or two or more projectors 16. Indeed, it may be desirable in some applications for a system 10 and a device 40 to include more than one camera, more than one projector, or more than one of the other system components.
Capture projection in a collaborative environment
[0054] Figures 20 to 22 illustrate three example scenarios for using a capture projection device 40 with other devices. In the usage scenario of figure 20, capture projection device 40 is connected to a computer workstation 88 and a mobile device 90. In the usage scenario of figure 21, multiple capture projection devices 40 are connected to one another through a server 92. In the usage scenario of figure 22, capture projection devices 40, computer workstation 88, and mobile device 90 are connected to one another through a server 92. Each connection 94 in figures 20 to 22 generally represents one or more of a cable, wireless, fiber-optic, or remote connection via a telecommunication link, an infrared link, a radio-frequency link, or any other connector or system that allows electronic communication between the connected devices. While individual connections 94 are shown, multiple devices can use the same connection. In addition, other usage scenarios are possible. For example, multiple capture projection devices 40 could be connected to one another directly, without a server 92.
[0055] In each of the scenarios illustrated in figures 20 to 22, individual users can create, manipulate, transfer, and store virtual objects with devices 40, 88, 90, and multiple users can collaborate among devices 40, 88, 90 with a mixture of real and virtual objects. Virtual objects include, for example, digital content rendered as images projected by capture projection devices 40, as well as digital content rendered on a display, in the form of slides, documents, digital photos, and the like, on the other devices 88 and 90. As noted above with reference to figures 1A, 1B, 2, and 3, each capture projection device 40 can be configured to project an object image 22 at the same position in workspace 24 that object 20 occupied when its image was captured by camera 14. Thus, a one-to-one scale digital duplicate of an object 20 can be projected over the original, allowing the digital duplicate to be manipulated, moved, and otherwise altered in its place as desired by remote users collaborating in workspace 12, whether projected on devices 40 or displayed on devices 88, 90. Any suitable alteration technique can be used, including, for example, touch and gesture recognition such as “pinch to zoom”, input from an IR pen 28 to a projection device 40, and/or altering a digital file on a computing device 88, 90. The projected image can also be moved away from the original, allowing a user to work with the original and the duplicate together in the same workspace 12.
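Projecting an object image back at the same workspace position it was captured from amounts to mapping captured coordinates into projector coordinates. A minimal sketch of such a mapping, assuming a hypothetical camera-to-projector homography obtained by calibration (the patent does not specify the mapping technique):

```python
def apply_homography(H, x, y):
    """Map a workspace point (x, y) through a 3x3 homography H
    (given as nested lists), returning projector coordinates."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)

# If camera and projector frames coincide, the homography is identity
# and the duplicate lands exactly where the object was captured.
IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

For a one-to-one scale duplicate, the calibration must also preserve physical size; here the identity matrix stands in for that calibrated mapping.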
[0056] Figure 23 is a block diagram illustrating an example of a capture projection device 40 that includes an audio and video teleconferencing function 96. Referring to figure 23, conferencing function 96 includes a front-facing camera 98, a microphone 100, and a speaker 102. The addition of conferencing function 96 allows device 40 to function as a full-featured collaboration tool. The capture projection device 40 shown in figure 23 also includes an object recognition device 104 for distinguishing between real and virtual objects in the workspace. In the example shown, object recognition device 104 includes an infrared camera 106 and an infrared lamp 108. Whereas workspace camera 14 will see real objects as well as projected images (virtual objects), infrared camera 106 will see only real objects. Thus, the video streams (or still image data) from cameras 14 and 106 can be used to distinguish real objects from virtual objects in the workspace, for example by programming resident in controller 18.
[0057] An LED, laser, or other suitable infrared lamp 108 can be used with camera 106 to illuminate the workspace and improve object recognition. In addition, while it may be possible to use the same infrared camera both for object recognition (camera 106) and for sensing an IR pen (camera 30 in figures 4, 15, and 16), it is expected that the camera frame rate for object recognition will not normally need to be as high as the frame rate for sensing pen position, although it may require a higher resolution. Consequently, it may be desirable in some implementations to use separate infrared cameras for object recognition and for pen sensing. IR camera 106 is just one example of a suitable object recognition device 104; other implementations are possible. A depth camera, for example, could be used in device 40 in place of an IR camera to distinguish between real and virtual objects in the workspace.
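The real/virtual distinction described above reduces to comparing what the two cameras report: a region seen by both cameras is a real object, while a region seen only by the visible-light camera is a projection. A minimal sketch, assuming objects have already been segmented into labeled regions (region labels here are illustrative):

```python
def classify_objects(visible_regions, ir_regions):
    """Split regions detected by the workspace camera into real vs. virtual.

    visible_regions: ids of objects seen by the visible-light camera 14
                     (real objects plus projected images).
    ir_regions:      ids of objects seen by infrared camera 106
                     (real objects only, since a projected image is not
                     a physical body under IR illumination).
    """
    real = visible_regions & ir_regions     # present in both streams
    virtual = visible_regions - ir_regions  # visible-light only
    return real, virtual
```

The same comparison works for a depth camera in place of the IR camera: real objects have measurable depth, projections do not.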
[0058] Figure 24 illustrates an example programming and signal-processing architecture for implementing the capture projection device 40 of figure 23 in a collaborative environment. In this example, real and virtual objects are managed separately. Real objects captured with workspace camera 14 are managed using motion video, while virtual objects are managed using still-image graphics, such as bitmap images, vector graphics objects, and text. For a standalone capture projection device 40, the blocks shown in figure 24 are implemented in controller 18. Referring to figure 24, the video streams from workspace camera 14 and front-facing camera 98 go to an outgoing video manager 110, from which they are sent to the connected devices at block 112. Still images from the workspace camera are routed to an object manager 114. Object manager 114 is the system component that stores and manipulates digital content, including compositing object images for display by the projector at block 116 and compositing object images for sending to the connected devices at block 118. Audio signals from conferencing microphone 100 go to an outgoing audio manager 120, from which they are sent to the connected devices at block 112. Inputs from the connected devices at block 122 are routed to the appropriate manager: still images are routed to object manager 114, video is routed to an incoming video manager 124 for sending to object compositing 116 and projector 16, and audio is routed to an incoming audio manager 126 for sending to speaker 102.
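The input side of figure 24 is a dispatch by media type: each item arriving from a connected device goes to exactly one manager. A minimal sketch of that routing, using illustrative names for the managers (the patent describes them only as blocks):

```python
def route_input(kind, payload, managers):
    """Route an input from a connected device (block 122 in figure 24)
    to the manager that handles its media type."""
    destinations = {
        "still": "object_manager",   # still images -> object manager 114
        "video": "incoming_video",   # video -> incoming video manager 124
        "audio": "incoming_audio",   # audio -> incoming audio manager 126
    }
    managers[destinations[kind]].append(payload)

# Managers modeled as simple queues for the sketch.
managers = {"object_manager": [], "incoming_video": [], "incoming_audio": []}
route_input("still", "shared_photo.png", managers)
route_input("audio", "mic-frame-0", managers)
```

Downstream, the incoming video manager would forward to object compositing and the projector, and the incoming audio manager to the speaker, per figure 24.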
[0059] An example of managing the interaction between real and virtual objects will now be described with reference to figures 25 to 28. In this example, real and virtual objects are treated as visually interchangeable logical layers, which allows the capture projection device to interpret and control its workspace and, in a collaborative environment, helps each user interact effectively with local and remote objects. Figure 25 is a block diagram illustrating one example implementation of this overlapping technique through a programming module 128 resident in controller 18 of the capture projection device. Overlay module 128 associates a real object with one visual layer (or a set of layers) and associates a virtual object with another visual layer (or set of layers). As noted above with reference to figure 24, real objects captured with workspace camera 14 are managed using digital motion video, while virtual objects are managed using still-image graphics. Object manager 114 stores and manipulates the digital content, including the visual overlays implemented in overlay module 128. In one example, each digital video and still-image element is associated with a position in an XYZ coordinate system. Overlay module 128 uses the XYZ position information to characterize the relative position of each layered element in each plane of the coordinate system: the XY, XZ, and YZ planes. The visual position of each element can then be changed by manipulating the corresponding layer(s). For example, where the XY layer for one object (or for an element in an object) may initially appear to be above the XY layer for another object, controller 18 can change that visual placement by moving one or both layers in the Z direction.
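Changing visual placement by moving layers in the Z direction can be sketched very simply: each element keeps a Z coordinate, the draw order is the Z sort, and raising one element above another just rewrites its Z. The names and one-unit step below are illustrative, not from the patent:

```python
def visual_order(layers):
    """Return element names from bottom to top, sorted by Z coordinate."""
    return [name for name, z in sorted(layers.items(), key=lambda kv: kv[1])]

def move_above(layers, element, target):
    """Raise `element` just above `target` by moving it in the Z direction."""
    layers[element] = layers[target] + 1.0

# Initially the projected image appears below the real object.
layers = {"work_surface": 0.0, "object_image": 1.0, "real_object": 2.0}
```

Calling `move_above(layers, "object_image", "real_object")` reverses the visual adjacency of the two objects without touching their XY positions.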
[0060] Reference will now be made to figures 26 and 27 to help illustrate the overlapping technique. Figure 26 shows a capture projection device 40 and a workspace 12 in which real pieces 130 are positioned on a virtual checkerboard 132 projected onto work surface 24. Workspace 12 also includes virtual pieces 134 projected onto checkerboard 132, for example using digital content from one or more of the connected devices 40, 88, 90 of figure 22. Solid lines indicate real objects in figure 26, and dashed lines indicate virtual objects. Figure 27 illustrates the Z-axis layering for the objects in figure 26. The Z-axis position of doubled pieces (kings) is represented by logical layer 136 over layer 138. Layer 138 represents the position of single pieces over layer 140. Layer 140 represents the position of the virtual checkerboard 132 over layer 142, which represents the position of work surface 24. Once the position of an object, or of an element in an object, is associated with a logical layer, overlay module 128 (figure 25) maintains state information about the visual order of the layers. As the visual relationships change, for example when a piece in figure 26 is moved, the layers are reordered according to the new position data associated with one or more of the digital elements.
[0061] In general, and with reference to the flowchart of figure 28 together with the block diagrams of figures 23 and 25, controller 18 identifies the presence and location of a real object in workspace 12 at block 202, for example using workspace camera 14 and object recognition device 104. At block 204, overlay module 128 associates the real object with a first logical layer (or with a first set of logical layers, for three-dimensional spatial positioning). At block 206, a virtual object (an object image) is projected into workspace 12 at a location corresponding to a second logical layer. Then, in response to a change in the position of one of the objects, or in response to some other user input, overlay module 128 changes the visual adjacency of the logical layers to reflect the change/input at block 208. For example, the first layer may initially be visually adjacent to work surface 24 and the second layer visually adjacent to the first layer. In response to a change/input, the order of the layers is reversed, so that the second layer becomes visually adjacent to work surface 24 and the first layer becomes visually adjacent to the second layer.
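The flow of blocks 202 to 208 can be condensed into one step: given the work surface, a first layer (real object), and a second layer (virtual object), a change/input reverses the adjacency of the two object layers. A minimal sketch under that assumption (the list model is illustrative; the module's real layer state is richer):

```python
def collaboration_step(stack, change_detected):
    """Blocks 202-208 of figure 28 as a sketch. `stack` is the visual
    order from bottom to top: [work surface, first layer, second layer].
    A detected change/input (block 208) reverses the visual adjacency
    of the two object layers."""
    surface, first, second = stack          # blocks 202-206: layers assigned
    if change_detected:                     # block 208: reorder on change/input
        first, second = second, first
    return [surface, first, second]
```

Applied to the example in the text, the first layer starts adjacent to the work surface and ends up on top after the change.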
[0062] Using a capture projection system 10 (through a portable device 40, for example) in a collaborative environment as shown in figures 20 to 22 allows each collaborator to see and interact with both local and remote objects. Live video transmission between the connected devices shares user inputs in real time across a variety of different input devices and techniques. While collaborator interaction may be richer when every participant is using a capture projection device 40 (or system 10), an effective collaborative environment is still available when different types of devices are used. Productive cooperative work is possible where one participant uses a capture projection system 10 and the other participants use devices running system 10 client programming. In such “mixed” environments, instead of interacting with the system 10 workspace, a participant would use a mouse, keyboard, or touch-sensitive pad to interact through the client programming. Instead of a projected display, the client programming on a collaborating device 88 or 90 can use a display window in the computer's workspace. And, instead of an overhead camera, the client application can use screen captures to capture images of objects.
[0063] As noted at the beginning of the present Description, the examples presented in the figures and described above illustrate, but do not limit, the invention. Other examples, configurations, and implementations are possible. Therefore, the foregoing description should not be understood as limiting the scope of the invention, which is defined in the following claims.
Claims (14)
[0001]
1. Capture projection system, characterized by the fact that it comprises: - a controller (18); - a workspace camera operatively connected to the controller for capturing still images and video images of an object (20) in a workspace (12); - a projector (16) operatively connected to the controller (18); the controller (18) being configured to control the workspace camera and the projector (16) to capture an image of a real object in the workspace (12) and to project the image of the object into the workspace (12); wherein the controller is configured to control the camera and the projector to capture an image of the object image as the object image is altered in the workspace and to project the altered image of the object into the workspace; and an object recognition device (104) operatively connected to the controller (18) to distinguish the real object in the workspace (12) from the object image and the altered image of the object projected in the workspace (12).
[0002]
2. System, according to claim 1, characterized by the fact that the controller (18) is configured to control the projector (16) to project the object image into the workspace (12) as a one-to-one scale digital duplicate of the real object.
[0003]
3. System, according to claim 2, characterized by the fact that the controller (18) is configured to control the projector (16) to project the image of the object into the workspace (12) at the same position the real object occupied when the image of the object was captured.
[0004]
4. System, according to claim 1, characterized by the fact that the controller (18), the camera, and the projector (16) are housed together as a single portable unit.
[0005]
5. System, according to claim 1, characterized by the fact that it further comprises: - an input/output operatively connected to the controller (18), through which digital information can be received from, and sent to, a device external to the capture projection device; and - a videoconferencing camera, a microphone, and a speaker, each operatively connected to the input/output for sending audio and video information to an external device and for receiving audio and video information from an external device.
[0006]
6. System, according to claim 5, characterized by the fact that the controller (18), the cameras, the projector (16), the input/output, the microphone, and the speaker are housed together as a single portable unit.
[0007]
7. System, according to any one of claims 1 to 6, characterized by the fact that the object recognition device (104) comprises an infrared camera and light source, or in which the recognition device (104) comprises a depth detection camera.
[0008]
8. System, according to any one of claims 1 to 7, characterized by the fact that the workspace camera is a digital camera, the projector emits white light onto the workspace, and the object recognition device includes an infrared camera and an infrared light.
[0009]
9. Workspace collaboration method, characterized by the fact that it comprises: - capturing a digital image of a real object in a first workspace; - simultaneously projecting the image of the object in multiple workspaces, including the first workspace; - capturing a digital image of an altered object image while it is being altered in one of the workspaces; - simultaneously projecting the altered object image in multiple workspaces, including the first workspace; and - distinguishing, by an object recognition device (104) operatively connected to a controller, the real object in the workspace from the object image and the altered image of the object projected in the workspace.
[0010]
10. Method, according to claim 9, characterized by the fact that simultaneously projecting the image of the object in multiple workspaces includes projecting the image of the object in the first workspace onto the real object.
[0011]
11. Method, according to claim 9, characterized by the fact that projecting the image of the object in multiple workspaces comprises projecting a one-to-one scale digital duplicate image of the real object in the multiple workspaces.
[0012]
12. Method, according to any one of claims 9 to 11, characterized by the fact that the object recognition device (104) comprises an infrared camera and light source, or in which the recognition device (104) comprises a depth camera.
[0013]
13. Method, according to any one of claims 9 to 12, characterized by the fact that the workspace camera is a digital camera, the projector emits white light in the workspace, and the object recognition device includes an infrared camera and an infrared light.
[0014]
14. Executable means of processing, characterized by the fact that it includes programming which, when executed, causes the system as defined in claims 1 to 8 to perform a method as defined in claims 9 to 13.
Similar technologies:
Publication number | Publication date | Patent title
BR112014002186B1|2020-12-29|capture projection system, executable means of processing and method of collaboration in the workspace
JP6068392B2|2017-01-25|Projection capturing system and projection capturing method
US9521276B2|2016-12-13|Portable projection capture device
CN107426503B|2020-04-28|Intelligent lighting device
EP3111636B1|2020-07-01|Telepresence experience
JP6078884B2|2017-02-15|Camera-type multi-touch interaction system and method
US9531995B1|2016-12-27|User face capture in projection-based systems
US10869009B2|2020-12-15|Interactive display
BR112014002448B1|2021-12-07|PROJECTION CAPTURE DEVICE
Foote et al.2005|Annotating Remote Reality with iLight
CN112689994A|2021-04-20|Demonstration system and demonstration method
Patent family:
Publication number | Publication date
CN104024936A|2014-09-03|
JP5941146B2|2016-06-29|
WO2013019255A1|2013-02-07|
JP2014529921A|2014-11-13|
KR101795644B1|2017-11-08|
US20140139717A1|2014-05-22|
BR112014002186A2|2017-03-01|
KR20140054025A|2014-05-08|
US9369632B2|2016-06-14|
EP2748675A1|2014-07-02|
US20160255278A1|2016-09-01|
EP2748675A4|2015-03-18|
EP2748675B1|2018-05-23|
US9560281B2|2017-01-31|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US4986651A|1989-08-04|1991-01-22|Minnesota Mining And Manufacturing Company|Overhead projector with centerless Fresnel lens reflective stage|
EP0622722B1|1993-04-30|2002-07-17|Xerox Corporation|Interactive copying system|
GB9614837D0|1996-07-12|1996-09-04|Rank Xerox Ltd|Interactive desktop system with multiple image capture and display modes|
EP0859977A2|1996-09-12|1998-08-26|Eidgenössische Technische Hochschule, Eth Zentrum, Institut für Konstruktion und Bauweisen|Interaction area for data representation|
JPH10222436A|1997-02-12|1998-08-21|Meidensha Corp|Transfer method for program and data|
GB2359895B|2000-03-03|2003-09-10|Hewlett Packard Co|Camera projected viewfinder|
US6965460B1|2000-08-08|2005-11-15|Hewlett-Packard Development Company, L.P.|Method and system for scanning an image using a look-down linear array scanner|
WO2002043390A2|2000-11-06|2002-05-30|Jianbo Shi|Paper-based remote sketching system|
US6431711B1|2000-12-06|2002-08-13|International Business Machines Corporation|Multiple-surface display projector with interactive input capability|
US7259747B2|2001-06-05|2007-08-21|Reactrix Systems, Inc.|Interactive video display system|
US7710391B2|2002-05-28|2010-05-04|Matthew Bell|Processing an image utilizing a spatially varying pattern|
JP2003152851A|2001-11-14|2003-05-23|Nec Corp|Portable terminal|
EP1387211B1|2002-07-15|2005-05-25|Sony International GmbH|Image recording device combined with image projection capabilities|
US20040095562A1|2002-11-20|2004-05-20|John Moffatt|Combination scanner/projector|
JP3927168B2|2002-11-25|2007-06-06|日本電信電話株式会社|Real world object recognition method and real world object recognition system|
US6840627B2|2003-01-21|2005-01-11|Hewlett-Packard Development Company, L.P.|Interactive display device|
US7203384B2|2003-02-24|2007-04-10|Electronic Scripting Products, Inc.|Implement for optically inferring information from a planar jotting surface|
JP4401728B2|2003-09-30|2010-01-20|キヤノン株式会社|Mixed reality space image generation method and mixed reality system|
US20050078092A1|2003-10-08|2005-04-14|Clapper Edward O.|Whiteboard desk projection display|
US7110100B2|2003-11-04|2006-09-19|Electronic Scripting Products, Inc.|Apparatus and method for determining an inclination of an elongate object contacting a plane surface|
US7268956B2|2003-11-24|2007-09-11|Electronic Scripting Products, Inc.|Solid catadioptric lens with two viewpoints|
US7038846B2|2003-11-24|2006-05-02|Electronic Scripting Products, Inc.|Solid catadioptric lens with a single viewpoint|
US7088440B2|2003-12-22|2006-08-08|Electronic Scripting Products, Inc.|Method and apparatus for determining absolute position of a tip of an elongate object on a plane surface with invariant features|
US7826641B2|2004-01-30|2010-11-02|Electronic Scripting Products, Inc.|Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features|
US7961909B2|2006-03-08|2011-06-14|Electronic Scripting Products, Inc.|Computer interface employing a manipulated object with absolute pose detection component and a display|
US8542219B2|2004-01-30|2013-09-24|Electronic Scripting Products, Inc.|Processing pose data derived from the pose of an elongate object|
US9229540B2|2004-01-30|2016-01-05|Electronic Scripting Products, Inc.|Deriving input from six degrees of freedom interfaces|
JP4533641B2|2004-02-20|2010-09-01|オリンパス株式会社|Portable projector|
US7023536B2|2004-03-08|2006-04-04|Electronic Scripting Products, Inc.|Apparatus and method for determining orientation parameters of an elongate object|
US7161664B2|2004-04-13|2007-01-09|Electronic Scripting Products, Inc.|Apparatus and method for optical determination of intermediate distances|
US7432917B2|2004-06-16|2008-10-07|Microsoft Corporation|Calibration of an interactive display system|
US7113270B2|2004-06-18|2006-09-26|Electronics Scripting Products, Inc.|Determination of an orientation parameter of an elongate object with a scan beam apparatus|
US7519223B2|2004-06-28|2009-04-14|Microsoft Corporation|Recognizing gestures and using gestures for interacting with software applications|
US7557966B2|2004-08-11|2009-07-07|Acushnet Company|Apparatus and method for scanning an object|
US20060126128A1|2004-12-15|2006-06-15|Lexmark International, Inc.|Scanning assembly|
JP4568121B2|2005-01-04|2010-10-27|日本電信電話株式会社|Remote work support system, remote work support method and program|
EP1686554A3|2005-01-31|2008-06-18|Canon Kabushiki Kaisha|Virtual space generating system, image processing apparatus and information processing method|
CN101208738B|2005-04-11|2011-11-09|波利维森有限公司|Automatic projection calibration|
JP4670053B2|2005-06-21|2011-04-13|幾朗 長|Actual size display device|
EP1898260A4|2005-06-30|2014-07-09|Ricoh Co Ltd|Projection image display device|
JP4657060B2|2005-08-25|2011-03-23|シャープ株式会社|projector|
BRPI0615283A2|2005-08-29|2011-05-17|Evryx Technologies Inc|interactivity through mobile image recognition|
US7599561B2|2006-02-28|2009-10-06|Microsoft Corporation|Compact interactive tabletop with projection-vision|
US7729515B2|2006-03-08|2010-06-01|Electronic Scripting Products, Inc.|Optical navigation apparatus using fixed beacons and a centroid sensing device|
US20080018591A1|2006-07-20|2008-01-24|Arkady Pittel|User Interfacing|
JP4777182B2|2006-08-01|2011-09-21|キヤノン株式会社|Mixed reality presentation apparatus, control method therefor, and program|
US7690795B2|2006-10-06|2010-04-06|Hewlett-Packard Development Company, L.P.|Projector/camera system|
JP2008227883A|2007-03-13|2008-09-25|Brother Ind Ltd|Projector|
US8199117B2|2007-05-09|2012-06-12|Microsoft Corporation|Archive for physical and digital objects|
US9377874B2|2007-11-02|2016-06-28|Northrop Grumman Systems Corporation|Gesture recognition light and video image projector|
JP5277703B2|2008-04-21|2013-08-28|株式会社リコー|Electronics|
CN101261557B|2008-04-30|2011-09-14|北京汇冠新技术股份有限公司|Image sensing apparatus for touch screen|
JP2010154361A|2008-12-25|2010-07-08|Kyocera Corp|Projection apparatus, radio communication terminal, slide image browsing system, and slide image browsing method|
US8355038B2|2009-01-28|2013-01-15|Hewlett-Packard Development Company, L.P.|Systems for capturing images through a display|
US8121640B2|2009-03-19|2012-02-21|Microsoft Corporation|Dual module portable devices|
US20100271394A1|2009-04-22|2010-10-28|Terrence Dashon Howard|System and method for merging virtual reality and reality to provide an enhanced sensory experience|
JP5395507B2|2009-05-21|2014-01-22|キヤノン株式会社|Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and computer program|
KR101229807B1|2009-06-26|2013-02-05|광주과학기술원|Novel Drug Target against Hypersensitivity Immune or Inflammatory Disease|
KR20110003705A|2009-07-06|2011-01-13|엘지전자 주식회사|Method for displaying information in mobile terminal and mobile terminal using the same|
GB2469346B|2009-07-31|2011-08-10|Promethean Ltd|Calibration of interactive whiteboard|
JP2011081556A|2009-10-06|2011-04-21|Sony Corp|Information processor, method of processing information, program, and server|
US8842096B2|2010-01-08|2014-09-23|Crayola Llc|Interactive projection system|
US8490002B2|2010-02-11|2013-07-16|Apple Inc.|Projected display shared workspaces|
TWI423096B|2010-04-01|2014-01-11|Compal Communication Inc|Projecting system with touch controllable projecting picture|
US8751049B2|2010-05-24|2014-06-10|Massachusetts Institute Of Technology|Kinetic input/output|
US8736583B2|2011-03-29|2014-05-27|Intel Corporation|Virtual links between different displays to present a single virtual object|
US8928735B2|2011-06-14|2015-01-06|Microsoft Corporation|Combined lighting, projection, and image capture without video feedback|
US8842057B2|2011-09-27|2014-09-23|Z124|Detail on triggers: transitional states|
US8970709B2|2013-03-13|2015-03-03|Electronic Scripting Products, Inc.|Reduced homography for recovery of pose parameters of an optical apparatus producing image data with structural uncertainty|US9317109B2|2012-07-12|2016-04-19|Mep Tech, Inc.|Interactive image projection accessory|
US20110165923A1|2010-01-04|2011-07-07|Davis Mark L|Electronic circle game system|
WO2011094292A1|2010-01-28|2011-08-04|Pathway Innovations And Technologies, Inc.|Document imaging system having camera-scanner apparatus and personal computer based processing software|
EP2737693B1|2011-07-29|2020-01-08|Hewlett-Packard Development Company, L.P.|System and method of visual layering|
SE1200428A1|2012-07-09|2012-10-22|Electrolux Ab|Appliance for the kitchen|
GB2505708B|2012-09-11|2015-02-25|Barco Nv|Projection system with safety detection|
KR102001218B1|2012-11-02|2019-07-17|삼성전자주식회사|Method and device for providing information regarding the object|
JP5787099B2|2012-11-06|2015-09-30|コニカミノルタ株式会社|guidance information display device|
US9753534B2|2012-11-09|2017-09-05|Sony Corporation|Information processing apparatus, information processing method, and computer-readable recording medium|
JP6167511B2|2012-12-04|2017-07-26|セイコーエプソン株式会社|Document camera and document camera control method|
CN103869961A|2012-12-18|2014-06-18|联想有限公司|Method and system for interacting projector and video camera|
US20140307055A1|2013-04-15|2014-10-16|Microsoft Corporation|Intensity-modulated light pattern for active stereo|
US9866768B1|2013-04-29|2018-01-09|The United States Of America, As Represented By The Secretary Of Agriculture|Computer vision qualified infrared temperature sensor|
US20160077670A1|2013-07-31|2016-03-17|Hewlett-Packard Development Company, L.P.|System with projector unit and computer|
US20150049078A1|2013-08-15|2015-02-19|Mep Tech, Inc.|Multiple perspective interactive image projection|
EP3036602A4|2013-08-22|2017-04-12|Hewlett-Packard Development Company, L.P.|Projective computing system|
US10114512B2|2013-09-30|2018-10-30|Hewlett-Packard Development Company, L.P.|Projection system manager|
JP6349838B2|2014-01-21|2018-07-04|セイコーエプソン株式会社|POSITION DETECTION DEVICE, POSITION DETECTION SYSTEM, AND POSITION DETECTION DEVICE CONTROL METHOD|
WO2015116157A2|2014-01-31|2015-08-06|Hewlett-Packard Development Company, L.P.|Display unit manager|
CN105940359B|2014-01-31|2020-10-20|惠普发展公司,有限责任合伙企业|Touch sensitive pad for system with projector unit|
US9489724B2|2014-03-31|2016-11-08|The Boeing Company|Three-dimensional stereoscopic projection on complex surfaces|
US9355498B2|2014-06-19|2016-05-31|The Boeing Company|Viewpoint control of a display of a virtual product in a virtual environment|
US20160028781A1|2014-07-22|2016-01-28|International Business Machines Corporation|Surface computing based social interaction|
US20170219915A1|2014-07-31|2017-08-03|Hewlett-Packard Development Company, L.P.|White flash generation from a light emitting diodeprojector|
WO2016018388A1|2014-07-31|2016-02-04|Hewlett-Packard Development Company, L.P.|Implicitly grouping annotations with a document|
WO2016018406A1|2014-07-31|2016-02-04|Hewlett-Packard Development Company, L.P.|Image projection and capture with adjustment for white point|
US10664090B2|2014-07-31|2020-05-26|Hewlett-Packard Development Company, L.P.|Touch region projection onto touch-sensitive surface|
CN106796446B|2014-08-04|2020-05-12|惠普发展公司,有限责任合伙企业|Workspace metadata management|
KR101566543B1|2014-09-03|2015-11-05|재단법인 실감교류인체감응솔루션연구단|Method and system for mutual interaction using space information argumentation|
WO2016039713A1|2014-09-08|2016-03-17|Hewlett-Packard Development Company, L.P.|Capture and projection of an object image|
EP3191918B1|2014-09-12|2020-03-18|Hewlett-Packard Development Company, L.P.|Developing contextual information from an image|
EP3198366B1|2014-09-24|2021-01-06|Hewlett-Packard Development Company, L.P.|Transforming received touch input|
WO2016053311A1|2014-09-30|2016-04-07|Hewlett Packard Enterprise Development Lp|Artifact projection|
CN107077196B|2014-09-30|2020-01-21|惠普发展公司,有限责任合伙企业|Identifying objects on a touch-sensitive surface|
CN107407959B|2014-09-30|2021-04-30|惠普发展公司,有限责任合伙企业|Manipulation of three-dimensional images based on gestures|
US9961265B2|2014-10-06|2018-05-01|Shafiq Ahmad Chaudhry|Method for capturing and storing historic audiovisual data via a digital mirror|
WO2016068890A1|2014-10-28|2016-05-06|Hewlett-Packard Development Company, L.P.|Image data segmentation|
WO2016076874A1|2014-11-13|2016-05-19|Hewlett-Packard Development Company, L.P.|Image projection|
US20160173840A1|2014-12-10|2016-06-16|Casio Computer Co., Ltd.|Information output control device|
CN106033257B|2015-03-18|2019-05-31|联想有限公司|A kind of control method and device|
US10210607B1|2015-04-08|2019-02-19|Wein Holding LLC|Digital projection system and method for workpiece assembly|
US20160316113A1|2015-04-27|2016-10-27|Microsoft Technology Licensing, Llc|Integrated processing and projection device with object detection|
US10306193B2|2015-04-27|2019-05-28|Microsoft Technology Licensing, Llc|Trigger zones for objects in projected surface model|
US20160329006A1|2015-05-04|2016-11-10|Microsoft Technology Licensing, Llc|Interactive integrated display and processing device|
US20170090272A1|2015-05-12|2017-03-30|Muneer Ayaad|Foldable camera and projector with code activated controls|
JP6808623B2|2015-07-08|2021-01-06|Sony Semiconductor Solutions Corporation|Solid-state image sensor, drive method, and electronic devices|
US20180091733A1|2015-07-31|2018-03-29|Hewlett-Packard Development Company, L.P.|Capturing images provided by users|
US10580126B1|2016-01-14|2020-03-03|Wein Holding LLC|Automated system and method for lumber analysis|
WO2017143303A1|2016-02-17|2017-08-24|Meta Company|Apparatuses, methods and systems for sharing virtual elements|
TWI653563B|2016-05-24|2019-03-11|仁寶電腦工業股份有限公司|Projection touch image selection method|
JP2018006818A|2016-06-27|2018-01-11|キヤノン株式会社|Image reading method and image reading device|
US10957500B2|2016-07-15|2021-03-23|Apple Inc.|Keyboard backlighting with reduced driver circuitry|
WO2018022130A1|2016-07-27|2018-02-01|Scannmen Ltd.|Hybrid 3d optical scanning system|
KR20190050774A|2016-09-13|2019-05-13|Sony Corporation|Display device with detection function|
US10493636B1|2016-09-26|2019-12-03|Wein Holding LLC|Automated system and method for lumber picking|
CN107966781B|2017-12-25|2020-03-27|AVIC Luoyang Electro-Optical Equipment Research Institute|Reflector folding, unfolding and adjusting device in a see-through display optical system|
DE102018203344A1|2018-03-07|2019-09-12|BSH Hausgeräte GmbH|Interaction module|
TWI656362B|2018-03-26|2019-04-11|仁寶電腦工業股份有限公司|Electronic device and object rework method thereof|
CN110719429B|2018-07-12|2022-01-11|Visionvera Information Technology Co., Ltd.|Document camera processing method and device based on video networking|
FR3084173A1|2018-07-18|2020-01-24|Holomake|Motorized mechanical servo system of a holographic plane for manual precision guidance|
WO2020027818A1|2018-07-31|2020-02-06|Hewlett-Packard Development Company, L.P.|Determining location of touch on touch sensitive surfaces|
Legal status:
2018-12-18| B06F| Objections, documents and/or translations needed after an examination request according to art. 34 of the Industrial Property Law|
2019-11-19| B06U| Preliminary requirement: requests with searches performed by other patent offices: suspension of the patent application procedure|
2020-11-17| B09A| Decision: intention to grant|
2020-12-29| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 02/11/2011, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
USPCT/US2011/045983|2011-07-29|
PCT/US2011/045983|WO2013019190A1|2011-07-29|2011-07-29|System and method of visual layering|
USPCT/US2011/046253|2011-08-02|
PCT/US2011/046253|WO2013019217A1|2011-08-02|2011-08-02|Projection capture system and method|
PCT/US2011/053947|WO2013019252A1|2011-08-02|2011-09-29|Portable projection capture device|
USPCT/US2011/053947|2011-09-29|
PCT/US2011/058896|WO2013019255A1|2011-07-29|2011-11-02|Projection capture system, programming and method|